378 research outputs found

    SonicDraw: a web-based tool for sketching sounds and drawings

    We present SonicDraw, a web browser tool that lies between a drawing interface and a sound design interface. Through this ambiguity we aim to explore new kinds of user interaction, as the creative process can be led either by sound or by visual feedback loops. We performed a user evaluation to assess how users negotiated the affordances of the system and how it supported their creativity. We measured the System Usability Scale (SUS) and the Creativity Support Index (CSI), and conducted an inductive thematic analysis of qualitative feedback. Results indicate that users find SonicDraw a very easy and intuitive tool which fosters the exploration of new, unexpected combinations of sounds and drawings. However, the tool seems less effective at engaging highly skilled musicians or drawers who want to create more complex pieces. To infer knowledge about user interaction, we also propose a quantitative analysis of drawing dynamics. Two contrasting modes of interaction are likely occurring: one where sketches act as direct controls of sonic attributes (sound focus), and another where sketches feature semantic content (e.g. a house) that indirectly controls sound (visual focus).
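    The abstract does not spell out how the usability score is computed; as a point of reference, the sketch below shows the standard SUS scoring rule (ten 1-5 Likert items, odd items positively worded, even items negatively worded, scaled to 0-100), which such evaluations typically follow. The example responses are hypothetical.

```python
def sus_score(responses):
    """Standard System Usability Scale score (0-100) from ten 1-5 Likert responses.

    Odd-numbered items contribute (response - 1), even-numbered items contribute
    (5 - response); the summed contributions are scaled by 2.5.
    """
    if len(responses) != 10 or not all(1 <= r <= 5 for r in responses):
        raise ValueError("SUS expects ten responses on a 1-5 scale")
    total = 0
    for i, r in enumerate(responses, start=1):
        total += (r - 1) if i % 2 == 1 else (5 - r)
    return total * 2.5

# Hypothetical, fairly positive participant
print(sus_score([4, 2, 5, 1, 4, 2, 5, 2, 4, 1]))  # 85.0
```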

    User experience in an interactive music virtual reality system: An exploratory study

    The Objects VR interface and study explore interactive music and virtual reality, focusing on user experience, understanding of musical functionality, and interaction issues. Our system offers spatio-temporal music interaction using 3D geometric shapes and their designed relationships. Control is provided by hand tracking, and the experience is rendered on a head-mounted display with binaural sound presented over headphones. The evaluation of the system uses a mixed-methods approach based on semi-structured interviews, surveys and video-based interaction analysis. On average, the system was positively received in terms of interview self-report and metrics for spatial presence and creative support. Interaction analysis and interview thematic analysis also revealed instances of frustration with interaction and levels of confusion with system functionality. Our results allow reflection on design criteria and discussion of implications for facilitating music engagement in virtual reality. Finally, our work discusses the effectiveness of measures with respect to future evaluation of novel interactive music systems in virtual reality.

    Co-design of a Smart Cajon

    The work of Luca Turchet is supported by a Marie-Curie Individual Fellowship of the European Union's Horizon 2020 research and innovation program, under grant agreement No. 749561. Mathieu Barthet also acknowledges support from the EU H2020 Audio Commons grant (688382).

    Moodplay: an interactive mood-based musical experience


    Novel Methods in Facilitating Audience and Performer Interaction Using the Mood Conductor Framework

    While listeners’ emotional response to music is the subject of numerous studies, less attention is paid to the dynamic emotion variations due to the interaction between artists and audiences in live improvised music performances. By opening a direct communication channel from audience members to performers, the Mood Conductor system provides an experimental framework to study this phenomenon. Mood Conductor facilitates interactive performances and thus also has an inherent entertainment value. The framework allows audience members to send emotional directions using their mobile devices in order to “conduct” improvised performances. Emotion coordinates indicated by the audience in the arousal-valence space are aggregated and clustered to create a video projection. This is used by the musicians as guidance, and provides visual feedback to the audience. Three different systems have so far been developed and tested within our framework. These systems were trialled in several public performances with different ensembles. Qualitative and quantitative evaluations demonstrated that musicians and audiences were highly engaged with the system, and raised new insights enabling future improvements of the framework.
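    The abstract does not specify how the audience's emotion coordinates are aggregated; the sketch below is one plausible illustration (not the Mood Conductor implementation), clustering hypothetical valence-arousal votes with k-means so that each cluster centre could act as a dominant "mood direction" in the projection shown to the musicians.

```python
# Illustrative only: cluster audience (valence, arousal) votes into a few
# dominant moods that could drive a visual projection.
import numpy as np
from sklearn.cluster import KMeans

# Hypothetical batch of (valence, arousal) votes, each in [-1, 1]
votes = np.array([
    [0.8, 0.6], [0.7, 0.5], [0.9, 0.7],   # happy / excited
    [-0.6, -0.4], [-0.5, -0.5],           # sad / calm
    [-0.7, 0.8], [-0.6, 0.7],             # tense / angry
])

kmeans = KMeans(n_clusters=3, n_init=10, random_state=0).fit(votes)
sizes = np.bincount(kmeans.labels_, minlength=3)

# Each centroid is a candidate mood; its cluster size can weight how
# prominently it is rendered in the feedback visualisation.
for centre, size in zip(kmeans.cluster_centers_, sizes):
    print(f"valence={centre[0]:+.2f} arousal={centre[1]:+.2f} votes={size}")
```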

    "It's cleaner, definitely": Collaborative Process in Audio Production.

    Working from vague client instructions, how do audio producers collaborate to diagnose what specifically is wrong with a piece of music, where the problem is, and what to do about it? This paper presents a design ethnography that uncovers some of the ways in which two music producers co-ordinate their understanding of complex representations of pieces of music while working together in a studio. Our analysis shows that audio producers constantly make judgements based on audio and visual evidence while working with complex digital tools, which can lead to ambiguity in assessments of issues. We show how multimodal conduct guides the process of work and that complex media objects are integrated as elements of interaction by the music producers. The findings provide an understanding of how people currently collaborate when producing audio, to support the design of better tools and systems for collaborative audio production in the future.

    Examining Emotion Perception Agreement in Live Music Performance

    Current music emotion recognition (MER) systems rely on emotion data averaged across listeners and over time to infer the emotion expressed by a musical piece, often neglecting time- and listener-dependent factors. These limitations can restrict the efficacy of MER systems and cause misjudgements. In a live music concert setting, fifteen audience members annotated perceived emotion in valence-arousal space over time using a mobile application. Analyses of inter-rater reliability yielded widely varying levels of agreement in the perceived emotions. A follow-up lab study was conducted to uncover the reasons for such variability, in which twenty-one listeners annotated their perceived emotions while listening to a recording of the original performance and offered open-ended explanations. Thematic analysis reveals many salient features and interpretations that can describe the underlying cognitive processes. Some of the results confirm known findings of music perception and MER studies. Novel findings highlight the importance of less frequently discussed musical attributes, such as musical structure, performer expression, and stage setting, as perceived across different modalities. Musicians are found to attribute emotion change to musical harmony, structure, and performance technique more than non-musicians. We suggest that listener-informed musical features can benefit MER in addressing emotional perception variability by providing reasons for listener similarities and idiosyncrasies.
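    The abstract does not name the reliability statistic used; one common, simple measure for time-continuous annotations is the mean pairwise Pearson correlation between raters' time series, sketched below on synthetic data. The signal shapes and noise level are assumptions for illustration only.

```python
# Minimal sketch: quantify inter-rater agreement on time-continuous
# arousal (or valence) annotations via mean pairwise Pearson correlation.
import numpy as np
from itertools import combinations

def mean_pairwise_correlation(annotations):
    """annotations: array of shape (n_raters, n_timesteps)."""
    corrs = [np.corrcoef(a, b)[0, 1] for a, b in combinations(annotations, 2)]
    return float(np.mean(corrs))

rng = np.random.default_rng(0)
t = np.linspace(0, 2 * np.pi, 200)
signal = np.sin(t)                                   # shared emotion contour
raters = signal + rng.normal(0, 0.5, (15, t.size))   # 15 listeners with individual noise

print(round(mean_pairwise_correlation(raters), 2))
```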

    Crossroads: Interactive Music Systems Transforming Performance, Production and Listening

    We discuss several state-of-the-art systems that propose new paradigms and user workflows for music composition, production, performance, and listening. We focus on a selection of systems that exploit recent advances in semantic and affective computing, music information retrieval (MIR) and the semantic web, as well as insights from fields such as mobile computing and information visualisation. These systems offer the potential to provide transformative experiences for users, which is manifested in creativity, engagement, efficiency, discovery and affect.

    Designing Computationally Creative Musical Performance Systems

    This is work in progress in which we outline a design process for a computationally creative musical performance system using the Creative Systems Framework (CSF). The proposed system is intended to produce virtuosic interpretations, and subsequent synthesised renderings of these interpretations with a physical model of a bass guitar, using case-based reasoning and reflection. We present our interpretations of virtuosity and musical performance, outline the suitability of case-based reasoning in computationally creative systems, and introduce notions of computational creativity and the CSF. We design our system by formalising the components of the CSF and briefly outline a potential implementation. In doing so, we demonstrate how the CSF can be used as a tool to aid in designing computationally creative musical performance systems.
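    For readers unfamiliar with case-based reasoning, the toy sketch below illustrates only its "retrieve" step: looking up the stored case most similar to a query and reusing its interpretation parameters. The phrase features (tempo, register, note density) and parameters are hypothetical; the paper's actual case representation is not described in the abstract.

```python
# Toy case-based reasoning retrieval: reuse the interpretation parameters of
# the nearest stored case (Euclidean distance over hypothetical phrase features).
import math

cases = [
    # (feature vector: tempo, register, note density), stored interpretation parameters
    ((120, 0.4, 0.8), {"vibrato_depth": 0.2, "slide_rate": 0.6}),
    ((90, 0.7, 0.3), {"vibrato_depth": 0.5, "slide_rate": 0.1}),
    ((160, 0.2, 0.9), {"vibrato_depth": 0.1, "slide_rate": 0.8}),
]

def retrieve(query):
    """Return the interpretation of the most similar stored case."""
    return min(cases, key=lambda case: math.dist(case[0], query))[1]

# In practice features would be normalised before computing distances.
print(retrieve((110, 0.5, 0.7)))
```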
    • …